James' Conjecture for Hecke algebras of exceptional type, I
In this paper, and a second part to follow, we complete the programme
(initiated more than 15 years ago) of determining the decomposition numbers and
verifying James' Conjecture for Iwahori--Hecke algebras of exceptional type.
The new ingredients which allow us to achieve this aim are:
- the fact, recently proved by the first author, that all Hecke algebras of
finite type are cellular in the sense of Graham--Lehrer, and
- the explicit determination of $W$-graphs for the irreducible (generic)
representations of Hecke algebras of types $E_7$ and $E_8$ by Howlett and Yin.
Thus, we can reduce the problem of computing decomposition numbers to a
manageable size where standard techniques, e.g., Parker's {\sf MeatAxe} and its
variations, can be applied. In this part, we describe the theoretical
foundations for this procedure.
Comment: 24 pages; corrected some misprints, added Remark 4.1
Assessment Methods for Innovative Operational Measures and Technologies for Intermodal Freight Terminals
Freight transport by rail is a complex topic and, in recent years, a central issue of European policy. The evolution of
legislation and the White Paper 2011 demonstrate the European intention to re-launch this sector. The challenge is to promote the
intermodal transport system over road freight transport. In this context, intermodal freight terminals play a primary role in the
supply chain: they are the connection points between the various transport modes and the nodal points where freight is handled,
stored and transferred between different modes on the way to the final customer. To achieve the purpose proposed by the EC, both
performance improvements of existing intermodal freight terminals and the development of innovative intermodal freight terminals are necessary.
Many terminal performance improvements have been proposed and sometimes experimented with. They are based both on operational
measures (e.g. horizontal and parallel handling, faster and fully direct handling) and on innovative technologies (e.g. automatic systems
for horizontal and parallel handling, automated gates for data exchange) inside the terminals, often with contradictory results. The
research work described in this paper (developed within the EU project Capacity4Rail) focusses on the assessment of the effects that these
innovations can have in intermodal freight terminals. The innovative operational measures and technologies have been combined into
different scenarios, to be evaluated by a methodological approach that includes, among others, analytical methods and simulation models. The
output of this assessment method is a set of key performance indicators (KPIs), set up according to the terminal typologies and the proposals, and related
to different aspects (e.g. management, operation and organization). In the present work, suitable KPIs (e.g. total/partial transit times) have
been applied for the evaluation. Finally, in addition to the methodological framework illustrated, a real case study is presented: the
intermodal rail-road freight terminal Munich-Riem (Germany)
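As an illustration of the kind of KPI mentioned above (total/partial transit times), the following minimal sketch computes them from a timestamped event log. The event names, units and timestamps are invented for the example and are not taken from the paper:

```python
from datetime import datetime

# Hypothetical event log for one terminal: (load_unit, event, timestamp).
events = [
    ("U1", "gate_in", datetime(2016, 5, 2, 8, 0)),
    ("U1", "handled", datetime(2016, 5, 2, 8, 40)),
    ("U1", "loaded",  datetime(2016, 5, 2, 10, 5)),
    ("U2", "gate_in", datetime(2016, 5, 2, 8, 15)),
    ("U2", "handled", datetime(2016, 5, 2, 9, 10)),
    ("U2", "loaded",  datetime(2016, 5, 2, 11, 0)),
]

def transit_kpis(events):
    """Total transit time (gate_in -> loaded) and partial transit time
    (gate_in -> handled) per load unit, in minutes."""
    by_unit = {}
    for unit, event, ts in events:
        by_unit.setdefault(unit, {})[event] = ts
    kpis = {}
    for unit, ev in by_unit.items():
        total = (ev["loaded"] - ev["gate_in"]).total_seconds() / 60
        partial = (ev["handled"] - ev["gate_in"]).total_seconds() / 60
        kpis[unit] = {"total_min": total, "partial_min": partial}
    return kpis
```

In a scenario comparison, such KPIs would be computed once per simulated scenario and then contrasted across the operational measures and technologies under assessment.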
Potential Capabilities of Lunar Laser Ranging for Geodesy and Relativity
Lunar Laser Ranging (LLR), which has been carried out for more than 35 years,
is used to determine many parameters within the Earth-Moon system. This
includes the coordinates of terrestrial ranging stations and of lunar
retro-reflectors, as well as the lunar orbit, the lunar gravity field, and the
Moon's tidal acceleration. LLR data analysis also performs a number of
gravitational physics experiments, such as tests of the equivalence principle
and searches for a time variation of the gravitational constant, and determines
the values of several metric gravity parameters. These gravitational physics
parameters cause both secular
and periodic effects on the lunar orbit that are detectable with LLR.
Furthermore, LLR contributes to the determination of Earth orientation
parameters (EOP) such as nutation, precession (including relativistic
precession), polar motion, and UT1. The corresponding LLR EOP series is three
decades long. LLR can be used for the realization of both the terrestrial and
selenocentric reference frames. The realization of a dynamically defined
inertial reference frame, in contrast to the kinematically realized frame of
VLBI, offers new possibilities for mutual cross-checking and confirmation.
Finally, LLR also investigates the processes related to the Moon's interior
dynamics. Here, we review the LLR technique focusing on its impact on Geodesy
and Relativity. We discuss the modern observational accuracy and the level of
existing LLR modeling. We present the near-term objectives and emphasize
improvements needed to fully utilize the scientific potential of LLR.Comment: 7 pages, 7 figures, 2 tables. Talk given at `Dynamic Planet 2005:
Monitoring and Understanding a Dynamic Planet with Geodetic and Oceanographic
Tools,'' a Joint Assembly of International Associations: IAG, IAPSO and IABO,
Cairns, Australia, 22-26 August 200
Adaptive kNN using Expected Accuracy for Classification of Geo-Spatial Data
The k-Nearest Neighbor (kNN) classification approach is conceptually simple -
yet widely applied since it often performs well in practical applications.
However, using a global constant k does not always provide an optimal solution,
e.g., for datasets with an irregular density distribution of data points. This
paper proposes an adaptive kNN classifier where k is chosen dynamically for
each instance (point) to be classified, such that the expected accuracy of
classification is maximized. We define the expected accuracy as the accuracy of
a set of structurally similar observations. An arbitrary similarity function
can be used to find these observations. We introduce and evaluate different
similarity functions. For the evaluation, we use five different classification
tasks based on geo-spatial data. Each classification task consists of (tens of)
thousands of items. We demonstrate that the presented expected accuracy
measures can be a good estimator for kNN performance, and the proposed adaptive
kNN classifier outperforms common kNN and previously introduced adaptive kNN
algorithms. Also, we show that the range of k values considered can be
significantly reduced to speed up the algorithm without a negative influence on classification
accuracy
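The per-instance choice of k described above can be sketched as follows. Plain Euclidean distance stands in for the similarity function, and the neighbourhood size `m`, the candidate k range, and the leave-one-out estimate of expected accuracy are illustrative assumptions rather than the paper's exact definitions:

```python
import numpy as np
from collections import Counter

def knn_predict(X, y, query, k):
    """Plain kNN: majority vote among the k nearest training points."""
    dist = np.linalg.norm(X - query, axis=1)
    nearest = np.argsort(dist)[:k]
    return Counter(y[nearest]).most_common(1)[0][0]

def expected_accuracy(X, y, query, k, m=10):
    """Estimate the accuracy of kNN with this k on the m training points
    most similar to the query (leave-one-out over those points)."""
    dist = np.linalg.norm(X - query, axis=1)
    similar = np.argsort(dist)[:m]
    hits = 0
    for i in similar:
        mask = np.arange(len(X)) != i          # leave point i out
        pred = knn_predict(X[mask], y[mask], X[i], k)
        hits += (pred == y[i])
    return hits / m

def adaptive_knn_predict(X, y, query, k_range=range(1, 16, 2)):
    """Pick, per query, the k with the highest expected accuracy."""
    best_k = max(k_range, key=lambda k: expected_accuracy(X, y, query, k))
    return knn_predict(X, y, query, best_k)
```

Restricting `k_range`, as the abstract's last sentence suggests, directly reduces the number of expected-accuracy evaluations per query.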
Age Correction in Dementia – Matching to a Healthy Brain
In recent research, many univariate and multivariate approaches have been proposed to improve automatic classification of various dementia syndromes using imaging data. Some of these methods do not allow the integration of possible confounding variables such as age into the statistical evaluation. A similar problem sometimes exists in clinical studies, as it is not always possible to match different clinical groups to each other in all confounding variables, as, for example, with early-onset (age<65 years) and late-onset (age≥65) patients with Alzheimer's disease (AD). Here, we propose a simple method to control for possible effects of confounding variables such as age prior to statistical evaluation of magnetic resonance imaging (MRI) data using support vector machine (SVM) classification or voxel-based morphometry (VBM). We compare SVM results for the classification of 80 AD patients and 79 healthy control subjects based on MRI data with and without prior age correction. Additionally, we compare VBM results for the comparison of three different groups of AD patients differing in age with the same group of control subjects obtained without including age as a covariate, with age as a covariate, or with prior age correction using the proposed method. SVM classification using the proposed method resulted in higher between-group classification accuracy compared to uncorrected data. Further, applying the proposed age correction substantially improved univariate detection of disease-related grey matter atrophy using VBM in AD patients differing in age from control subjects. The results suggest that the approach proposed in this work is generally suited to control for confounding variables such as age in SVM or VBM analyses. Accordingly, the approach might improve and extend the application of these methods in clinical neurosciences
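A common realization of this kind of correction, consistent with the idea of "matching to a healthy brain", is to estimate a linear age effect on the imaging features in healthy controls only (so that disease effects cannot leak into the estimate) and subtract it from every subject. The paper's exact procedure may differ; the function name and the linear model below are assumptions for illustration:

```python
import numpy as np

def age_correct(features, age, control_mask):
    """Remove a linear age effect from imaging features.

    The per-feature age-regression slope is estimated on healthy
    controls only, then the predicted age effect is subtracted from
    all subjects (patients and controls alike).
    """
    age = np.asarray(age, dtype=float)
    centered = age - age[control_mask].mean()
    cc = centered[control_mask]
    # least-squares slope per feature, controls only
    slope = (cc @ features[control_mask]) / (cc @ cc)
    return features - np.outer(centered, slope)
```

After this step, a classifier such as an SVM sees features whose linear dependence on age has been removed, so age differences between groups are less likely to drive the decision boundary.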
Assessing the Effects of Enhanced Supply Chain Visibility Through RFID
The majority of RFID implementations can be traced back either to mandates issued by companies or institutions with significant market power like Wal-Mart or the U.S. Department of Defense, or to the replacement of existing Auto-ID technologies like barcodes. Only sporadically is RFID being used to derive superior information about current processes in order to create supply chain visibility. In this contribution, we examine the visibility potentials of RFID technology within the context of SCM and we propose a four-step approach to assessing the results that can be achieved through visibility
MapperMania: A Framework for Native Multi-Tenancy Business Object Mapping to a Persistent Data Source
The Software-as-a-Service delivery model poses new challenges for application developers. Especially in the context of enterprise resource planning software targeting the SME market, new problems arise; most of them cluster around multi-tenancy. This paper describes the framework MapperMania, which aims to establish an abstraction layer between the persistence layer and the domain model. Leveraging MapperMania, the domain model can abstract from multi-tenancy and from changes in the underlying infrastructure
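The core idea, keeping multi-tenancy out of the domain model, can be illustrated with a deliberately simplified, hypothetical mapper; this is not MapperMania's actual API, and the class and method names are invented:

```python
class TenantMapper:
    """Hypothetical tenant-aware persistence mapper: the domain model
    never sees the tenant column; the mapper adds it on writes and
    filters by it on reads."""

    def __init__(self, store, tenant_id):
        self.store = store          # shared list standing in for a table
        self.tenant_id = tenant_id

    def save(self, obj):
        # tag the row with the owning tenant before persisting it
        self.store.append(dict(obj, tenant=self.tenant_id))

    def find_all(self):
        # strip the tenant column before handing rows to the domain model
        return [{k: v for k, v in r.items() if k != "tenant"}
                for r in self.store if r["tenant"] == self.tenant_id]
```

Because tenant filtering lives entirely in the mapper, the same domain objects work unchanged whether the underlying store is shared across tenants or physically separated per tenant.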